Bias, Variance, and Error-Correcting Output Codes for Local Learners

Authors

  • Francesco Ricci
  • David W. Aha
Abstract

This paper focuses on a bias-variance decomposition analysis of a local learning algorithm, the nearest neighbor classifier, that has been extended with error-correcting output codes. This extended algorithm often considerably reduces the 0-1 (i.e., classification) error in comparison with nearest neighbor (Ricci & Aha, 1997). The analysis presented here reveals that this performance improvement is obtained by drastically reducing bias at the cost of increasing variance. We also show that, even in classification problems with few classes (m ≤ 5), extending the codeword length beyond the limit that assures column separation yields an error reduction. This error reduction is not only in the variance, which is due to the voting mechanism used for error-correcting output codes, but also in the bias.
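
The mechanism the abstract describes can be made concrete with a small sketch. The code below is only illustrative, not the authors' implementation: it builds a random binary code matrix whose columns are separated (each column splits the classes into two non-empty groups), relabels the training data per output bit, runs a nearest-neighbor classifier per bit, and decodes by Hamming distance, which is the voting step the abstract credits with the variance reduction. The per-bit feature weighting is a crude stand-in for the locally learned weights of Ricci & Aha (1997); all names here (ECOCNearestNeighbor, random_code_matrix) are hypothetical.

```python
# Minimal, illustrative sketch of nearest neighbor with error-correcting
# output codes (ECOC). NOT the authors' implementation.
import numpy as np

def random_code_matrix(n_classes, n_bits, rng):
    """One binary codeword (row) per class, with distinct rows and
    non-constant columns (the 'column separation' the abstract mentions)."""
    while True:
        M = rng.integers(0, 2, size=(n_classes, n_bits))
        rows_ok = len(np.unique(M, axis=0)) == n_classes
        cols_ok = all(0 < M[:, j].sum() < n_classes for j in range(n_bits))
        if rows_ok and cols_ok:
            return M

class ECOCNearestNeighbor:
    def __init__(self, n_bits, seed=0):
        self.n_bits, self.rng = n_bits, np.random.default_rng(seed)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.M = random_code_matrix(len(self.classes_), self.n_bits, self.rng)
        self.X_ = X
        # Relabel each training point with its class codeword.
        self.bits_ = self.M[np.searchsorted(self.classes_, y)]
        # Per-bit feature weights: |difference of class-conditional means|
        # for that bit's binary relabeling (a simplistic stand-in for the
        # local feature weighting studied in the paper).
        self.W = np.stack([
            np.abs(X[self.bits_[:, j] == 1].mean(0) -
                   X[self.bits_[:, j] == 0].mean(0))
            for j in range(self.n_bits)
        ])
        return self

    def predict(self, X):
        pred_bits = np.empty((len(X), self.n_bits), dtype=int)
        for j in range(self.n_bits):  # weighted 1-NN per output bit
            d = (self.W[j] * (X[:, None, :] - self.X_[None, :, :]) ** 2).sum(-1)
            pred_bits[:, j] = self.bits_[d.argmin(1), j]
        # Decode by Hamming distance to the class codewords: each bit "votes".
        ham = (pred_bits[:, None, :] != self.M[None, :, :]).sum(-1)
        return self.classes_[ham.argmin(1)]
```

Note that extending the codeword length (n_bits) beyond what column separation requires simply adds more voters, which is the regime the abstract reports as reducing both variance and bias.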


Similar articles

Extending Local Learners with Error-correcting Output Codes

Error-correcting output codes (ECOCs) represent classes with a set of output bits, where each bit encodes a binary classification task corresponding to a unique partition of the classes. Algorithms that use ECOCs learn the function corresponding to each bit, and combine them to generate class predictions. ECOCs can reduce both variance and bias errors for multiclass classification tasks when the ...


Error-Correcting Output Codes for Local Learners

Error-correcting output codes (ECOCs) represent classes with a set of output bits, where each bit encodes a binary classification task corresponding to a unique partition of the classes. Algorithms that use ECOCs learn the function corresponding to each bit, and combine them to generate class predictions. ECOCs can reduce both variance and bias errors for multiclass classification tasks when the ...


Cloud Classification Using Error-Correcting Output Codes

Novel artificial intelligence methods are used to classify 16x16 pixel regions (obtained from Advanced Very High Resolution Radiometer (AVHRR) images) in terms of cloud type (e.g., stratus, cumulus, etc.). We previously reported that intelligent feature selection methods, combined with nearest neighbor classifiers, can dramatically improve classification accuracy on this task. Our subsequent analy...


An approach to fault detection and correction in design of systems using of Turbo codes

We present an approach to the design of fault tolerant computing systems. In this paper, a technique is employed that enables the combination of several codes in order to obtain flexibility in the design of error-correcting codes. Code combining techniques are very effective; one such code is the turbo code. Algorithm-based fault tolerance techniques that detect errors rely on the c...


The Bias Variance Trade-Off in Bootstrapped Error Correcting Output Code Ensembles

By performing experiments on publicly available multi-class datasets we examine the effect of bootstrapping on the bias/variance behaviour of error-correcting output code ensembles. We present evidence to show that the general trend is for bootstrapping to reduce variance but to slightly increase bias error. This generally leads to an improvement in the lowest attainable ensemble error, however...

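Several of the abstracts above rest on an empirical bias/variance decomposition of 0-1 loss. As a hedged illustration of the usual estimation recipe (in the spirit of the Kong & Dietterich and Domingos decompositions; the papers listed may each use a different variant), one trains the learner on many bootstrap replicates of the training set, takes the majority ("main") prediction per test point, and reads bias off the main prediction's error and variance off the spread around it. The function name and interface below are hypothetical.

```python
# Hedged sketch of an empirical bias/variance estimate for 0-1 loss
# (one common variant; the papers above may define the terms differently).
import numpy as np
from collections import Counter

def bias_variance_01(make_learner, X, y, X_test, y_test, n_rounds=50, seed=0):
    """make_learner: zero-argument factory returning an object with
    fit(X, y) and predict(X)."""
    rng = np.random.default_rng(seed)
    preds = np.empty((n_rounds, len(X_test)), dtype=y.dtype)
    for r in range(n_rounds):
        boot = rng.integers(0, len(X), size=len(X))  # bootstrap training set
        preds[r] = make_learner().fit(X[boot], y[boot]).predict(X_test)
    # Main prediction: most frequent label per test point across rounds.
    main = np.array([Counter(col).most_common(1)[0][0] for col in preds.T])
    bias = float((main != y_test).mean())     # systematic error of main prediction
    variance = float((preds != main).mean())  # average disagreement with it
    return bias, variance
```

Combined with the earlier sketch, a call such as bias_variance_01(lambda: ECOCNearestNeighbor(n_bits=15), X, y, X_test, y_test) would reproduce the kind of measurement these abstracts discuss.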


Publication date: 1997